
    Solving the G-problems in less than 500 iterations: Improved efficient constrained optimization by surrogate modeling and adaptive parameter control

    Constrained optimization of high-dimensional numerical problems plays an important role in many scientific and industrial applications. In many industrial applications, function evaluations are severely limited and no analytical information about the objective function and constraint functions is available. For such expensive black-box optimization tasks, the constrained optimization algorithm COBRA was proposed, which uses RBF surrogate models for both the objective and the constraint functions. COBRA has shown remarkable success in reliably solving complex benchmark problems in fewer than 500 function evaluations. Unfortunately, COBRA requires careful parameter adjustment in order to do so. In this work we present a new self-adjusting algorithm, SACOBRA, which is based on COBRA and capable of achieving high-quality results with very few function evaluations and no parameter tuning. With the help of performance profiles on a set of benchmark problems (G-problems, MOPTA08), we show that SACOBRA consistently outperforms any COBRA algorithm with a fixed parameter setting. We analyze the importance of the several new elements in SACOBRA and find that each of them contributes to the overall optimization performance. We discuss the underlying reasons and in this way gain a better understanding of high-quality RBF surrogate modeling.
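Both COBRA and SACOBRA build on RBF interpolation as the surrogate. As a rough illustration of that ingredient only, here is a minimal pure-Python sketch with a Gaussian kernel and a toy 1-D target standing in for an expensive black-box function; the shape parameter `eps` and the sample points are arbitrary choices, not values from the paper:

```python
import math

def rbf_fit(xs, ys, eps=1.0):
    """Fit a Gaussian-RBF interpolant through (xs, ys) by solving
    the dense system Phi w = y with naive Gaussian elimination."""
    n = len(xs)
    phi = [[math.exp(-(eps * (xs[i] - xs[j])) ** 2) for j in range(n)]
           for i in range(n)]
    a = [row[:] + [y] for row, y in zip(phi, ys)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]        # partial pivoting
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n + 1):
                a[r][c] -= f * a[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):                 # back substitution
        w[r] = (a[r][n] - sum(a[r][c] * w[c] for c in range(r + 1, n))) / a[r][r]
    return w

def rbf_eval(xs, w, x, eps=1.0):
    """Evaluate the surrogate at x: a weighted sum of Gaussian bumps."""
    return sum(wi * math.exp(-(eps * (x - xi)) ** 2) for wi, xi in zip(w, xs))

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [math.sin(x) for x in xs]   # stand-in for expensive evaluations
w = rbf_fit(xs, ys)
```

The surrogate reproduces the sampled values exactly and approximates the function in between, which is what lets an optimizer query it cheaply instead of the true objective.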

    Online Adaptable Time Series Anomaly Detection with Discrete Wavelet Transforms and Multivariate Gaussian Distributions

    In this paper we present an unsupervised time series anomaly detection algorithm based on the discrete wavelet transform (DWT) that operates fully online. Given streaming data or time series, the algorithm iteratively computes the (causal and decimating) discrete wavelet transform. For the individual frequency scales of the current DWT, the algorithm estimates the parameters of a multivariate Gaussian distribution. These parameters are adapted in an online fashion. Based on the multivariate Gaussian distributions, unusual patterns can then be detected across frequency scales, which in certain constellations indicate anomalous behavior. The algorithm is tested on a diverse set of 425 time series. A comparison with several other state-of-the-art online anomaly detectors shows that our algorithm mostly produces results similar to the best algorithm on each dataset. It produces the highest average F1-score with one standard parameter setting and works more stably on high- and low-frequency anomalies than all other algorithms. We believe that the wavelet transform is an important ingredient in achieving this.
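The two main ingredients can be illustrated by combining a decimating Haar DWT with a per-scale Gaussian outlier test. This sketch is batch and univariate per scale, unlike the online, multivariate method of the paper; the threshold `k` and the toy signal are assumptions:

```python
import math

def haar_dwt(signal):
    """Full decimating Haar DWT: returns the detail coefficients of
    each frequency scale, finest scale first."""
    scales, approx = [], list(signal)
    while len(approx) >= 2:
        details = [(approx[i] - approx[i + 1]) / math.sqrt(2)
                   for i in range(0, len(approx) - 1, 2)]
        approx = [(approx[i] + approx[i + 1]) / math.sqrt(2)
                  for i in range(0, len(approx) - 1, 2)]
        scales.append(details)
    return scales

def anomaly_scores(details, k=3.0):
    """Flag detail coefficients more than k standard deviations from
    the scale's mean (univariate Gaussian stand-in)."""
    mu = sum(details) / len(details)
    sd = math.sqrt(sum((d - mu) ** 2 for d in details) / len(details)) or 1e-12
    return [abs(d - mu) / sd > k for d in details]

signal = [math.sin(i / 5) for i in range(64)]
signal[32] += 10.0                              # injected spike
flags = anomaly_scores(haar_dwt(signal)[0])     # test the finest scale
```

The spike shows up as a single large detail coefficient at the finest scale, so exactly one position is flagged.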

    Surrogate-Assisted Optimization for Augmentation of Finite Element Techniques

    The application of finite element techniques for the analysis and optimization of complex thermo-mechanical structures typically involves highly nonlinear models for material characterization, tribological contact, large deformation, damage, etc. These nonlinearities usually call for a higher-order spatio-temporal discretization, including a large number of elements and time steps, in order to provide good convergence and sufficiently accurate simulation results. This inevitably leads to many simulations that are expensive in terms of cost and time if model parameters have to be optimized or adapted. In this work, a FEM simulation modeling approach is proposed which uses radial basis function (RBF) interpolations as efficient surrogate models to save FEM simulations. In addition, a surrogate-assisted optimization algorithm [3] is utilized to find, with as few FEM simulations as possible, the parameter setting that leads to maximum damage in a simple tensile testing scenario involving a notched specimen. The relatively high accuracy of the utilized surrogate models shows promising results and indicates the potential of surrogate models for saving time-expensive simulations.
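The surrogate-assisted loop itself is simple to sketch: run a few expensive simulations, fit a cheap model to them, let the model propose the next candidate, and spend a real simulation only on that candidate. Below, a quadratic through the three best points stands in for the RBF surrogate and a toy function stands in for the FEM damage response; all names and values are illustrative, not from the paper:

```python
def expensive_sim(x):
    """Stand-in for a costly FEM run; toy 'damage' peaking at x = 0.7."""
    return -(x - 0.7) ** 2

def parabola_peak(p):
    """Vertex of the quadratic through three (x, y) points (Lagrange form)."""
    (x1, y1), (x2, y2), (x3, y3) = p
    d = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / d
    b = (x3**2 * (y1 - y2) + x2**2 * (y3 - y1) + x1**2 * (y2 - y3)) / d
    return -b / (2 * a)

pts = [(x, expensive_sim(x)) for x in (0.0, 0.5, 1.0)]  # initial design
for _ in range(3):                       # each iteration costs one "FEM run"
    seen, top = set(), []
    for x, y in sorted(pts, key=lambda t: -t[1]):
        if x not in seen:                # keep the three best distinct designs
            seen.add(x)
            top.append((x, y))
        if len(top) == 3:
            break
    x_new = parabola_peak(top)           # surrogate proposes a candidate
    pts.append((x_new, expensive_sim(x_new)))
best = max(pts, key=lambda t: t[1])
```

With only six true evaluations the loop locates the maximum-damage setting, which is the whole appeal when each evaluation is an hours-long simulation.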

    Online Adaptable Learning Rates for the Game Connect-4

    Learning board games by self-play has a long tradition in computational intelligence for games. Ever since Tesauro's seminal success with TD-Gammon in 1994, many successful agents have used temporal difference learning. But in order to be successful with temporal difference learning on game tasks, a careful selection of features and a large number of training games are often necessary. Even for board games of moderate complexity like Connect-4, we found in previous work that a very rich initial feature set and several million game plays are required. In this work we investigate whether different approaches to online-adaptable learning rates, like Incremental Delta-Bar-Delta (IDBD) or Temporal Coherence Learning (TCL), have the potential to speed up learning for such a complex task. We propose a new variant of TCL with geometric step size changes. We compare these algorithms with several other state-of-the-art learning rate adaptation algorithms and perform a case study on the sensitivity with respect to their meta-parameters. We show that, within this set of learning algorithms, those with geometric step size changes outperform those with constant step size changes. Algorithms with nonlinear output functions are slightly better than linear ones. Algorithms with geometric step size changes learn faster by a factor of four compared to previously published results on the Connect-4 task.
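The flavour of geometric step size adaptation can be shown with a tiny SGD example: the learning rate is multiplied by a constant factor when successive gradients agree in sign and divided by it when they disagree. This is a simplified, capped sketch of the general idea on a toy linear-regression task, not the paper's exact TCL rule:

```python
def train_geometric_steps(data, alpha0=0.1, beta=1.2, cap=0.2, epochs=50):
    """Fit y = w * x by SGD; the step size grows geometrically while
    gradient signs agree and shrinks geometrically on disagreement.
    The cap keeps this sketch stable on the toy task."""
    w, alpha, last_grad = 0.0, alpha0, 0.0
    for _ in range(epochs):
        for x, y in data:
            grad = (w * x - y) * x              # gradient of 0.5*(w*x - y)**2
            if grad * last_grad > 0:
                alpha = min(alpha * beta, cap)  # consistent direction: speed up
            elif grad * last_grad < 0:
                alpha /= beta                   # oscillation: slow down
            w -= alpha * grad
            last_grad = grad
    return w

w = train_geometric_steps([(1.0, 3.0), (2.0, 6.0)])  # data lies on y = 3x
```

Because the gradient sign stays consistent here, the step size climbs geometrically to its cap and the weight converges to 3 much faster than with the initial rate alone.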

    Preliminary Study of Prospective ECG-Gated 320-Detector CT Coronary Angiography in Patients with Ventricular Premature Beats

    BACKGROUND: To study the applicability of prospective ECG-gated 320-detector CT coronary angiography (CTCA) in patients with ventricular premature beats (VPB), and to determine the scanning mode that best maximizes image quality and reduces radiation dose. METHODS: 110 patients undergoing CTCA were divided into a VPB group (60 cases) and a control group (50 cases). All patients then underwent coronary angiography (CAG) within one month. CAG served as the reference standard against which the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of CTCA in diagnosing significant coronary artery stenosis (luminal stenosis ≥50%) were analyzed. Two radiologists, each with more than 3 years' experience in cardiac CT, finished the image analysis after consultation. A personalized scanning mode was adopted to compare image quality and radiation dose between the two groups. RESULTS: At the coronary artery segment level, sensitivity, specificity, PPV, and NPV in the VPB group were 92.55%, 98.21%, 88.51%, and 98.72%, respectively. In the control group these values were 95.79%, 98.42%, 90.11%, and 99.28%, respectively. There was no significant difference in sensitivity, specificity, PPV, or NPV between the two groups. The two groups also had no significant difference in image quality score (P>0.05). Heart rate (77.20±12.07 bpm) and radiation dose (14.62±1.37 mSv) in the VPB group were higher than heart rate (58.72±4.73 bpm) and radiation dose (3.08±2.35 mSv) in the control group. In the VPB group, the radiation dose for S-field scanning (34.55±7.12 mSv) was significantly higher than that for M-field scanning (15.10±1.12 mSv). CONCLUSIONS/SIGNIFICANCE: With prospective ECG-gated scanning for VPB, the diagnostic accuracy for coronary artery stenosis is very high. Adjusting the scanning field can reduce radiation dose while maintaining good image quality. For patients with slow heart rates and good rhythm, there was no statistically significant difference in image quality.
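The four diagnostic measures used as endpoints follow directly from the 2×2 confusion counts of test result versus reference standard. A small helper makes the definitions explicit (the counts below are made up for illustration, not the study's actual segment counts):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV (as percentages) from
    true/false positive/negative counts of a diagnostic test."""
    return {
        "sensitivity": 100 * tp / (tp + fn),  # diseased correctly detected
        "specificity": 100 * tn / (tn + fp),  # healthy correctly cleared
        "ppv":         100 * tp / (tp + fp),  # positive calls that are right
        "npv":         100 * tn / (tn + fn),  # negative calls that are right
    }

m = diagnostic_metrics(tp=87, fp=11, fn=7, tn=605)  # illustrative counts
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on how common stenosis is in the evaluated segments, which is why all four are reported.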